A Fixed-point Proximity Approach to Solving the Support Vector Regression with the Group Lasso Regularization

Authors

  • Zheng Li
  • Guohui Song
  • Ben-yu Guo

Abstract

We introduce an optimization model of support vector regression with the group lasso regularization and develop a class of efficient two-step fixed-point proximity algorithms to solve it numerically. To overcome the difficulty brought by the non-differentiability of both the group lasso regularization term and the loss function in the proposed model, we characterize its solutions as fixed points of a nonlinear map defined in terms of the proximity operators of the functions appearing in the objective function of the model. We then propose a class of two-step fixed-point algorithms, based on the fixed-point equation, to solve the optimization problem numerically. We establish convergence results for the proposed algorithms. Numerical experiments on both synthetic data and real-world benchmark data are presented to demonstrate the advantages of the proposed model and algorithms.
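The abstract does not reproduce the algorithm itself, so the following is only a hedged sketch of the core ingredients it names: the proximity operator of the group lasso (block soft-thresholding) driving a simple proximal fixed-point iteration. The function names, the smooth quadratic stand-in loss, and the step-size rule are illustrative assumptions, not the paper's exact two-step scheme or its ε-insensitive loss.

```python
import numpy as np

def prox_group_lasso(v, groups, tau):
    """Proximity operator of tau * sum_g ||v_g||_2 (block soft-thresholding).
    Each group's block is shrunk toward zero; small blocks vanish entirely."""
    out = v.copy()
    for g in groups:
        norm = np.linalg.norm(v[g])
        out[g] = 0.0 if norm <= tau else (1.0 - tau / norm) * v[g]
    return out

def solve_group_lasso(A, b, groups, lam, iters=500):
    """Illustrative proximal fixed-point iteration (a stand-in, NOT the paper's
    two-step SVR algorithm): w <- prox_{tau*lam*R}(w - tau * grad L(w)) with a
    smooth least-squares loss L(w) = 0.5 * ||A w - b||^2."""
    tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step size <= 1/L, L = ||A||_2^2
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ w - b)
        w = prox_group_lasso(w - tau * grad, groups, tau * lam)
    return w
```

The fixed-point view is visible here: a solution is exactly a vector left unchanged by one step of the gradient-then-prox map.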


Similar articles

Mammalian Eye Gene Expression Using Support Vector Regression to Evaluate a Strategy for Detecting Human Eye Disease

Background and purpose: Machine learning is a class of modern, powerful tools that can solve many important problems people face today. Support vector regression (SVR) is a way to build a regression model and a remarkable member of the machine learning family. SVR has proven to be an effective tool in real-valued function estimation. As a supervised-learning appr...

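As a brief aside on the mechanism the entry above relies on: SVR fits a regression function while ignoring residuals inside an ε-tube. A minimal sketch of that ε-insensitive loss (the function name is an illustrative assumption):

```python
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """SVR's epsilon-insensitive loss: residuals within the eps-tube cost
    nothing; larger residuals are penalized linearly beyond the tube."""
    return np.maximum(np.abs(y_true - y_pred) - eps, 0.0)
```

Note this loss is non-differentiable at the tube boundary, which is exactly the difficulty the main paper's proximity-operator approach is designed to handle.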

On Sparsity Inducing Regularization Methods for Machine Learning

During the past years there has been an explosion of interest in learning methods based on sparsity regularization. In this paper, we discuss a general class of such methods, in which the regularizer can be expressed as the composition of a convex function ω with a linear function. This setting includes several methods such as the group Lasso, the Fused Lasso, multi-task learning and many more. We...



Efficient First Order Methods for Linear Composite Regularizers

A wide class of regularization problems in machine learning and statistics employ a regularization term which is obtained by composing a simple convex function ω with a linear transformation. This setting includes Group Lasso methods, the Fused Lasso and other total variation methods, multi-task learning methods and many more. In this paper, we present a general approach for computing the proxi...

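The entry above concerns computing the proximity operator of a composed regularizer ω∘B. As one concrete instance (a sketch under my own assumptions, not that paper's general approach), the prox of the 1-D total variation λ‖Dx‖₁ can be computed by projected gradient on its dual problem:

```python
import numpy as np

def prox_tv1d(v, lam, iters=2000):
    """Prox of lam * ||D x||_1 (1-D total variation), (Dx)_i = x_{i+1} - x_i.
    Uses the dual form x = v - D^T u with the box constraint |u_i| <= lam,
    solved by projected gradient ascent (an illustrative, dense-matrix sketch)."""
    n = len(v)
    D = np.diff(np.eye(n), axis=0)   # (n-1) x n finite-difference matrix
    u = np.zeros(n - 1)
    t = 0.25                         # step size <= 1 / ||D||_2^2 (< 4 in 1-D)
    for _ in range(iters):
        u = np.clip(u + t * (D @ (v - D.T @ u)), -lam, lam)
    return v - D.T @ u
```

For example, `prox_tv1d([0, 0, 5, 5], lam=1)` shrinks the single jump from both sides, approaching `[0.5, 0.5, 4.5, 4.5]`; a constant input is returned unchanged, since its total variation is already zero.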

Some Observations on Sparsity Inducing Regularization Methods for Machine Learning

During the past years there has been an explosion of interest in learning methods based on sparsity regularization. In this paper, we discuss a general class of such methods, in which the regularizer can be expressed as the composition of a convex function ω with a linear function. This setting includes several methods such as the group Lasso, the Fused Lasso, multi-task learning and many more. We ...



Publication date: 2017